20 research outputs found

    Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology

    The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as ‘deterministic components’ or ‘trends’ even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of some null hypothesis significance tests (NHSTs) for slowly-varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of NHST rationale, and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling. We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of records. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while some others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
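The naive trend-testing workflow the abstract critiques can be sketched as follows: a minimal Mann-Kendall test with no tie or autocorrelation correction (function names and the AR(1) experiment are illustrative, not taken from the paper), showing how short-term persistence alone inflates rejection rates well above the nominal level even when no trend is present.

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    """Mann-Kendall trend test (no tie or autocorrelation correction).
    Returns the S statistic, the normal-approximation Z score, and a
    two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant forward pairs
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return float(s), float(z), p

# Persistence inflates rejections: 200 trend-free AR(1) series of 60 "years"
rng = np.random.default_rng(1)
rejections = 0
for _ in range(200):
    e = rng.standard_normal(60)
    y = np.empty(60)
    y[0] = e[0]
    for t in range(1, 60):
        y[t] = 0.6 * y[t - 1] + e[t]   # persistent but stationary, no trend
    rejections += mann_kendall(y)[2] < 0.05
rate = rejections / 200                # well above the nominal 5 %
```

Under independence the rejection rate would hover near 5 %; with moderate AR(1) persistence it is several times larger, which is the mechanism behind the underestimated correlation structures discussed above.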

    Spatial And Temporal Modeling Of Radar Rainfall Uncertainties

    It is widely acknowledged that radar-based estimates of rainfall are affected by uncertainties (e.g., mis-calibration, beam blockage, anomalous propagation, and ground clutter) which are both systematic and random in nature. Improving the characterization of these errors would yield better understanding and interpretations of results from studies in which these estimates are used as inputs (e.g., hydrologic modeling) or initial conditions (e.g., rainfall forecasting). Building on earlier efforts, the authors apply a data-driven multiplicative model in which the relationship between true rainfall and radar rainfall can be described in terms of the product of a systematic and a random component. The systematic component accounts for conditional biases, approximated by a power-law function. The random component, which represents the random fluctuations remaining after correcting for systematic uncertainties, is characterized in terms of its probability distribution as well as its spatial and temporal dependencies. The space-time dependencies are computed using the non-parametric Kendall's τ measure. For the first time, the authors present a methodology based on conditional copulas to generate ensembles of random error fields with the prescribed marginal probability distribution and spatio-temporal dependencies. The methodology is illustrated using data from Clear Creek, a densely instrumented experimental watershed in eastern Iowa. Results are based on three years of radar data from the Davenport Weather Surveillance Radar 88 Doppler (WSR-88D) radar that were processed through the Hydro-NEXRAD system. The spatial and temporal resolutions are 0.5 km and hourly, respectively, and the radar data are complemented by rainfall measurements from 11 rain gages, located within the catchment, which are used to approximate true ground rainfall.
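The multiplicative error structure described above can be sketched on synthetic data: true rainfall is modeled as a power-law systematic bias applied to the radar estimate, times a random residual. All parameter values and variable names here are illustrative assumptions, not the fitted values from the Clear Creek study.

```python
import numpy as np

# Synthetic radar-gauge pairs; true_a, true_b and the noise level are
# illustrative assumptions, not values fitted in the study.
rng = np.random.default_rng(42)
radar = rng.gamma(2.0, 2.0, size=500) + 0.1          # strictly positive radar rainfall
true_a, true_b = 1.3, 0.85
gauge = true_a * radar**true_b * rng.lognormal(0.0, 0.3, size=500)

# Systematic component: conditional bias h(r) = a * r**b,
# fitted by least squares in log-log space.
A = np.column_stack([np.ones_like(radar), np.log(radar)])
coef, *_ = np.linalg.lstsq(A, np.log(gauge), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1]

# Random component: multiplicative residuals left after bias correction;
# the paper characterizes their distribution and space-time dependence
# (via Kendall's tau and conditional copulas) to simulate error ensembles.
eps = gauge / (a_hat * radar**b_hat)
```

The copula-based ensemble-generation step then amounts to resampling fields with the marginal distribution of `eps` and its estimated spatial and temporal dependence, which this sketch does not attempt.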

    Complexity–entropy analysis of daily stream flow time series in the continental United States

    Complexity–entropy causality plane (CECP) is a diagnostic diagram plotting normalized Shannon entropy HS versus Jensen–Shannon complexity CJS that has been introduced in nonlinear dynamics analysis to classify signals according to their degrees of randomness and complexity. In this study, we explore the applicability of CECP in hydrological studies by analyzing 80 daily stream flow time series recorded in the continental United States during a period of 75 years, surrogate sequences simulated by autoregressive models (with independent or long-range memory innovations), Theiler amplitude adjusted Fourier transform and Theiler phase randomization, and a set of signals drawn from nonlinear dynamic systems. The effect of seasonality, and the relationships between the CECP quantifiers and several physical and statistical properties of the observed time series are also studied. The results point out that: (1) the CECP can discriminate chaotic and stochastic signals in presence of moderate observational noise; (2) the signal classification depends on the sampling frequency and aggregation time scales; (3) both chaotic and stochastic systems can be compatible with the daily stream flow dynamics, when the focus is on the information content, thus setting these results in the context of the debate on observational equivalence; (4) the empirical relationships between HS and CJS and Hurst parameter H, base flow index, basin drainage area and stream flow quantiles highlight that the CECP quantifiers can be considered as proxies of the long-term low-frequency groundwater processes rather than proxies of the short-term high-frequency surface processes; (5) the joint application of linear and nonlinear diagnostics allows for a more comprehensive characterization of the stream flow time series.
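The two CECP quantifiers can be sketched from ordinal (permutation) pattern frequencies in the Bandt-Pompe scheme: HS is the normalized Shannon entropy of the pattern distribution, and CJS couples it with the Jensen-Shannon divergence from the uniform distribution. The embedding dimension d = 4 and the synthetic inputs below are illustrative choices, not the settings used in the study.

```python
import numpy as np
from itertools import permutations
from math import log

def ordinal_probs(x, d=4, tau=1):
    """Relative frequencies of ordinal patterns of embedding dimension d
    and delay tau (Bandt-Pompe symbolization)."""
    x = np.asarray(x)
    n = len(x) - (d - 1) * tau
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(n):
        w = x[i:i + (d - 1) * tau + 1:tau]
        counts[tuple(np.argsort(w))] += 1
    return np.array([c / n for c in counts.values()])

def cecp(p):
    """Normalized Shannon entropy H_S and Jensen-Shannon complexity C_JS."""
    N = len(p)
    def H(q):
        q = q[q > 0]
        return -np.sum(q * np.log(q))
    hs = H(p) / log(N)
    pe = np.full(N, 1.0 / N)                       # uniform reference
    jsd = H((p + pe) / 2) - H(p) / 2 - H(pe) / 2   # Jensen-Shannon divergence
    # normalization constant so that the complexity lies in [0, 1]
    q0 = -2.0 / ((N + 1) / N * log(N + 1) - 2 * log(2 * N) + log(N))
    return hs, q0 * jsd * hs
```

A monotone series occupies a single pattern (HS = 0, CJS = 0), while white noise sits near the (HS ≈ 1, CJS ≈ 0) corner; structured signals fall in between, which is what makes the plane a classifier.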

    BetaBit: A fast generator of autocorrelated binary processes for geophysical research

    We introduce a fast and efficient non-iterative algorithm, called BetaBit, to simulate autocorrelated binary processes describing the occurrence of natural hazards, system failures, and other physical and geophysical phenomena characterized by persistence, temporal clustering, and low rate of occurrence. BetaBit overcomes the simulation constraints posed by the discrete nature of the marginal distributions of binary processes by using the link existing between the correlation coefficients of this process and those of the standard Gaussian processes. The performance of BetaBit is tested on binary signals with power-law and exponentially decaying autocorrelation functions (ACFs) corresponding to Hurst-Kolmogorov and Markov processes, respectively. An application to real-world sequences describing rainfall intermittency and the occurrence of strong positive phases of the North Atlantic Oscillation (NAO) index shows that BetaBit can also simulate surrogate data preserving the empirical ACF as well as signals with autoregressive moving average (ARMA) dependence structures. Extensions to cyclo-stationary processes accounting for seasonal fluctuations are also discussed.
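The binary-Gaussian link mentioned above can be illustrated with a simplified clipped-Gaussian sketch: threshold an AR(1) Gaussian parent to obtain a persistent 0/1 series. This is not the BetaBit algorithm itself (which non-iteratively inverts the exact mapping between binary and Gaussian correlations); the parameters are illustrative.

```python
import numpy as np
from statistics import NormalDist

def clipped_gaussian_binary(n, p, rho, seed=0):
    """0/1 series with marginal occurrence rate p, built by thresholding
    an AR(1) Gaussian parent with lag-1 correlation rho.  The binary
    series inherits persistence, but its lag-1 autocorrelation is
    smaller than rho -- the exact binary-Gaussian correlation mapping
    is what BetaBit inverts."""
    rng = np.random.default_rng(seed)
    innov = rng.standard_normal(n)
    z = np.empty(n)
    z[0] = innov[0]
    scale = np.sqrt(1.0 - rho**2)          # keeps the parent unit-variance
    for t in range(1, n):
        z[t] = rho * z[t - 1] + scale * innov[t]
    thr = NormalDist().inv_cdf(1.0 - p)    # exceedance threshold for rate p
    return (z > thr).astype(int)
```

Thresholding reproduces the occurrence rate exactly in expectation, but the binary ACF is attenuated relative to the parent's; prescribing the binary ACF and solving for the required Gaussian correlations is the problem BetaBit addresses.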

    A 3-copula function application for design hyetograph analysis

    A design hyetograph is a synthetic rainfall temporal pattern associated with a return period. Usually it is determined by means of statistical analysis of observed rainfall mean intensity through intensity–duration–frequency (IDF) curves. The other characteristics of a rainfall event, such as the peak, total depth and duration, are derived indirectly through several phases of hydrological analysis, under suitable working assumptions. The aim of this paper is to apply a multivariate approach in order to analyse jointly observed data of rainfall critical depth, maximum intensity and total depth. In particular, a bivariate analysis of intensity and total depth conditioned on critical depth is developed using a 3-copula function to define the trivariate joint distribution function. Following the proposed procedure, once a design return period and related critical depth are selected, it is possible to determine, in a probabilistic way, the intensity and total depth, without advancing a priori hypotheses on the design hyetograph pattern. In a case study, the results obtained with the proposed procedure are compared with those deduced from standard design hyetographs usually applied in practical hydrological applications.
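The conditional-sampling step can be sketched with a trivariate Gaussian copula as a stand-in for the 3-copula of the paper: fix the critical-depth margin at its design quantile u = 1 - 1/T and draw the intensity and total-depth margins from the conditional copula. The correlation matrix below is an illustrative assumption, not fitted to the case-study data.

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()

# Illustrative correlations among (critical depth, max intensity, total depth)
R = np.array([[1.0, 0.7, 0.8],
              [0.7, 1.0, 0.6],
              [0.8, 0.6, 1.0]])

def conditional_copula_sample(u_crit, n, seed=0):
    """Draw n pairs (u_intensity, u_depth) from a trivariate Gaussian
    copula with correlation R, conditional on the critical-depth margin
    being at probability level u_crit (e.g. u_crit = 1 - 1/T for a
    T-year design event)."""
    rng = np.random.default_rng(seed)
    z1 = nd.inv_cdf(u_crit)                 # conditioning value in Gaussian space
    r12 = R[0, 1:]                          # correlations with critical depth
    S = R[1:, 1:] - np.outer(r12, r12)      # conditional covariance
    mean = r12 * z1                         # conditional mean
    L = np.linalg.cholesky(S)
    z = mean + rng.standard_normal((n, 2)) @ L.T
    return np.array([[nd.cdf(v) for v in row] for row in z])

# e.g. uniform-scale intensity/total-depth pairs for a 100-year critical depth
u = conditional_copula_sample(1 - 1 / 100.0, 1000)
```

Mapping the sampled uniform margins through the fitted marginal distributions of intensity and total depth would then yield design values without fixing the hyetograph shape a priori, mirroring the procedure described above.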